Patent abstract:
System for detecting the tip of the pole and the mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling procedure. The invention provides a system for detecting the tip of the pole (boom) of a tanker aircraft and the mouth of the receiver's receptacle, for semi-automatic or automatic contact in in-flight boom refuelling, which requires no signalling devices on the receiving aircraft. The system and its associated procedure are robust and guarantee that the tanker's boom control system is supplied, reliably and in real time, with simultaneous information on the end of the pole and the mouth of the receiver's receptacle at every moment. For this the system has: 1) light emitters mounted on the tip of the pole, 2) a processing subsystem, and 3) two 3D cameras and either a TOF camera or a DOE-type camera (or both), in addition to at least one laser L to provide them with their specific functionality. (Machine translation, not legally binding.)
Publication number: ES2584554A1
Application number: ES201531734
Filing date: 2015-11-30
Publication date: 2016-09-28
Inventor: Alberto ADARVE LOZANO
Applicant: Defensya Ingenieria Internacional SL
IPC main classification:
Patent description:



processors, both of conventional type, which execute instructions sequentially (such as multi-core processors, FPGAs (Field Programmable Gate Arrays) and GPUs (Graphics Processing Units)), and others based on neural networks, with training and parallel-processing capabilities. The P element also comprises a communications subsystem linking it with the rest of the subsystems that make up the invention. The functions of the P element consist in obtaining, on the one hand, the position of the receiver and, on the other, the location of the boom, based on the information provided by the S3D, STOF and SDOE subsystems. Among other results, element P obtains a cloud of points of the receptacle, and of the parts attached to it, of the receiving aircraft; once that point cloud is known, by comparing it with information stored in a database of 3D models of the possible aircraft to be contacted, a 3D model of the receiving aircraft can be placed in a virtual space and the exact position of its receptacle obtained from it. The point cloud is also passed through a previously trained neural network to obtain the position of the receptacle a second time (redundantly). The same is done with the point-cloud data and the 3D model of the boom. Another function performed by element P is to determine the positions of the emitters of the BD element on the nozzle of the pole, in order to obtain the position of the end of the latter. Element P calculates all the significant points and vectors already indicated. It also adjusts dimensions and eliminates aberrations from the lenses or from the sensor itself; a prior calibration is essential for the correct operation of the entire system. The components of element P may be concentrated in a single location or dispersed in parts together with the other subsystems of this invention.
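As an illustration of the redundant, neural-network-based estimate described above, the following minimal sketch passes a point cloud through a small PointNet-style network to regress the receptacle position and its orthogonal vector. The architecture, dimensions and weights are illustrative assumptions, not the patent's implementation; a real system would use parameters trained on labelled point clouds of the candidate receivers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" parameters; real values would come from training
# on labelled point clouds of the possible receiver aircraft.
W1, b1 = rng.normal(size=(3, 64)), np.zeros(64)   # per-point feature lift
W2, b2 = rng.normal(size=(64, 6)), np.zeros(6)    # global regression head

def estimate_receptacle(cloud_xyz):
    """cloud_xyz: (N, 3) array of points w.r.t. the CR reference axes.
    Returns (position, unit vector) of the receptacle mouth."""
    feats = np.maximum(cloud_xyz @ W1 + b1, 0.0)  # per-point ReLU features
    pooled = feats.max(axis=0)                    # order-invariant pooling
    out = pooled @ W2 + b2
    pos, vec = out[:3], out[3:]
    return pos, vec / np.linalg.norm(vec)

position, normal = estimate_receptacle(rng.normal(size=(500, 3)))
```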
In a first instance, only the 3D cameras perform the necessary functionalities: the system is reduced to two cameras and the BD light-emitting device placed at the tip of the pole, all with their corresponding accessories, to which the processing element P must finally be added.
In a second, more complete implementation, all the subsystems are present, although in a first embodiment the laser that several subsystems use may be one and the same, and the functionality of their cameras may be performed by one of the 3D cameras or by both.
In successive embodiments, the components of each subsystem become independent and specialize in the task required by each specific subsystem, and the entire system adds more individual elements up to


The subsystems that make it up may be dispersed, placed in different zones of the tanker, in different implementations of this same patent.
Within element C we have up to three different subsystems, depending on the specific implementation of this patent:
1.- First, a subsystem that we call S3D (9), which contains the 3D cameras (17) and is responsible for locating the LEDs of the BD element described in point I (figure 1) and for determining the position of these emitters with respect to the cameras. It is also responsible for determining the position of the receptacle from the images obtained of the receiving aircraft on whose surface it is located. These cameras have their respective image sensors, processing electronics, focusing lenses (18) and a narrow band-pass filter B3 centered at a wavelength λ3 of the spectrum. Some cameras may have electronically controlled variable lenses (19). This wavelength is compatible with the other cameras involved in the refueling operation and is centered on the emission wavelength of the LEDs (16) of the BD element, which helps eliminate photons from other sources such as the sun itself. The additional electronics also have the mission of controlling the switching of the LEDs over time, generating certain patterns that further help distinguish them from the light emitted by other sources. The processing consists, essentially, in performing a cross-correlation between the generated light pattern and the light received in each image frame. Finally, this electronics, after detecting each LED emitter of the BD element that is visible from the cameras, calculates the distance and the remaining coordinates of each LED with respect to a set of reference axes which, for simplicity, are placed at the center of the sensor of one of the cameras and which we call CR. The S3D subsystem is powered by an aircraft power supply and outputs a set of coordinates (X, Y, Z) for the active points it locates in each frame. The processing electronics cover functionalities such as detecting the (x, y) coordinates of each active point located by each camera independently, as well as calculating the global coordinates with respect to the reference axes centered at CR from the (x, y) pairs of both cameras. They also adjust dimensions and eliminate aberrations from the lenses or from the sensor itself. A prior calibration is essential for their proper functioning.
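A minimal sketch of the cross-correlation step described above, under assumed data shapes: each pixel's intensity across a window of frames is correlated against the known LED drive pattern, and pixels whose normalized correlation exceeds a threshold are taken as LED candidates. The pattern, window length and threshold are illustrative assumptions.

```python
import numpy as np

pattern = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)  # assumed LED drive code
pattern -= pattern.mean()                                   # zero-mean template

def detect_led_pixels(frames, threshold=0.8):
    """frames: (T, H, W) stack of band-pass-filtered images, T == len(pattern).
    Returns a boolean (H, W) mask of pixels blinking with the LED pattern."""
    stack = frames - frames.mean(axis=0)            # remove static background
    corr = np.tensordot(pattern, stack, axes=(0, 0))
    norm = np.linalg.norm(pattern) * (np.linalg.norm(stack, axis=0) + 1e-9)
    return (corr / norm) > threshold                # normalized cross-correlation
```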
The distance calculation is performed for each frame-time interval, using the images obtained by both cameras at their image-acquisition frequency. Having identified a set of points in both images, we can obtain by triangulation the distance of each point to the cameras and thus obtain a cloud of points of our receiving aircraft and our


to identify the same points in both frame images from both cameras at each instant. From their positions in at least two cameras, and by a triangulation method similar to the one used to detect the light emitters in the previous section, the coordinates of all the points identified in all the S3D cameras are obtained. This set of coordinates is precisely the point cloud with respect to CR that is sought. Note that two adjoining sub-clouds of points are obtained: one corresponding to the end of the boom and another corresponding to the receiving aircraft.
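As a sketch of this triangulation under simplifying assumptions (a rectified camera pair with known focal length and baseline, both hypothetical values), the disparity of a point matched in both cameras yields its depth and hence its (X, Y, Z) with respect to CR:

```python
import numpy as np

def triangulate(xl, yl, xr, f=1200.0, b=0.5):
    """(xl, yl) and xr: matched pixel coordinates in the left/right cameras
    (principal point already removed). f in pixels, baseline b in metres
    (both assumed values). Returns (X, Y, Z) in metres w.r.t. CR."""
    disparity = xl - xr            # horizontal shift between the two views
    Z = f * b / disparity          # depth from similar triangles
    return np.array([xl * Z / f, yl * Z / f, Z])

# One matched point in both frames -> one point of the cloud.
p = triangulate(xl=105.0, yl=-32.0, xr=95.0)   # ~60 m away for these values
```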
- Obtaining a second point cloud, again corresponding to the end of the boom and the receiving aircraft, from the STOF subsystem, the laser L1 and the other auxiliary components. Laser L1 provides a train of light pulses of wavelength λ1. The circuit that triggers the firing of this laser is the same one that governs the shutter and the acquisition of image frames by the TOF-type camera included in STOF. Considering the speed of light and the time it takes for the generated pulse to be received at each pixel of said TOF camera's sensor, the distance to the scene point that reflects the received light can be obtained. To facilitate this task, a narrow band-pass filter B1 centered on λ1 is placed in front of the TOF-type camera. In addition, the phase-shift technique is used to determine exactly the instant at which the pulse emitted by L1 arrives back at the sensor. This is done for every point of the scene that is imaged onto each pixel of the TOF camera's sensor, giving a new cloud with as many points as the resolution of the sensor used. Each frame time, the TOF camera provides a new cloud of points.
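The per-pixel range equation behind this is the classic time-of-flight relation; the sketch below uses the phase-shift formulation with an assumed modulation frequency, which the patent does not specify.

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s

def tof_distance(phi, f_mod=10e6):
    """phi: per-pixel phase shift in radians (0..2*pi) from the TOF sensor.
    Distance = c * phi / (4 * pi * f_mod), i.e. half the round-trip path.
    f_mod = 10 MHz is an illustrative assumption."""
    return C * phi / (4.0 * np.pi * f_mod)

# A full sensor frame of phases becomes a distance image -> one point cloud
# per frame, at the camera's frame rate.
phases = np.random.default_rng(1).uniform(0, 2 * np.pi, size=(240, 320))
depth_m = tof_distance(phases)     # unambiguous up to c/(2*f_mod), ~15 m here
```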
- Obtaining a third point cloud, again corresponding to the end of the boom and the receiving aircraft, from the information provided, in a very similar way to the previous case, by the SDOE subsystem, formed by the DOE-type camera plus the laser L2 and other auxiliary components. The L2 laser generates a pattern of structured light (the pattern can be fixed or variable, since the laser's lenses are controlled like those of the other laser) thanks to the diffractive optical element through which its beam is passed once properly collimated. The elements of this pattern can be identified if we are able to "see" the light emitted by the laser when it is reflected by our surroundings. To facilitate this, we use another narrow band-pass filter, B2, placed in front of the SDOE camera and tuned to L2, which eliminates light of other wavelengths. In addition, switching the laser on and off with a certain cadence also helps us distinguish the laser light from that of other sources, which will not blink in the same way. With cross-correlation techniques we obtain the pixels where the pattern is reflected by the objects in our scene, and from their relative intensities we determine which pixels correspond to which points of the pattern. As a result we obtain a set of points that, again through triangulation and trigonometry techniques, taking into account that we know the distance from the L2 laser to the SDOE camera and the angles of both, allows us to obtain the distances from the SDOE camera to each point of that set. In short, we have a set of coordinates (xi, yi, zi) belonging to the objects in our scene, for each image frame. So, once again, we have a cloud of points similar to that obtained by the STOF camera, but obtained in a different way.
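A sketch of the angle-based triangulation just described, assuming the laser-to-camera baseline and the two ray angles are known (all values illustrative):

```python
import numpy as np

def sdoe_range(alpha, beta, baseline=0.6):
    """alpha: angle (radians) the L2 projection ray makes with the baseline;
    beta: angle the camera observation ray makes with it; baseline in metres
    (an assumed value). Returns camera-to-point distance via the law of sines."""
    gamma = np.pi - alpha - beta          # angle at the scene point
    return baseline * np.sin(alpha) / np.sin(gamma)

# One pattern dot identified in the image -> one range -> one cloud point.
r = sdoe_range(alpha=np.radians(88.0), beta=np.radians(91.4))   # ~57 m
```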
- The next step is, alternatively, either to merge the information of the point clouds, for each frame, to obtain the best starting point cloud, or to apply one of the processing methods that P can perform (explained later) to each of the point clouds and then fuse the results obtained, achieving the best and most robust solution for the position of the points and vectors of interest. As stated, all of this is done for each image frame over time. The calculation of relative velocities and accelerations, as well as of the orthogonal unit vectors indicated, is a purely algebraic matter that requires few processing resources. The processes that P can apply to the point clouds obtained by the different elements that make up this invention consist of:
• Passing them through a trained artificial neural network that provides as outputs the coordinates of the location and the orthogonal vector of the two points of interest with respect to our reference center CR.
• Comparing them with the stored 3D models of our receiver and of the boom, to find the position both of the refueling mouth of said receiver and of the center of the end of the nozzle (4) of the pole, once the two are separated; these points are again given with respect to our reference center CR. The great certainty with which the BD element provides the position of the tip of the boom allows us to eliminate the part of the point cloud corresponding to that tip and keep the sub-cloud corresponding exclusively to the receiving aircraft, as sketched below.
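A minimal sketch of that sub-cloud separation, assuming a simple distance criterion around the BD-derived boom tip (the radius is an assumption; the patent does not specify the criterion):

```python
import numpy as np

def split_subclouds(cloud, boom_tip, radius=1.5):
    """cloud: (N, 3) points; boom_tip: (3,) position from the BD element.
    Points within `radius` metres of the tip are attributed to the boom;
    the rest form the receiver sub-cloud. Returns (receiver, boom)."""
    d = np.linalg.norm(cloud - boom_tip, axis=1)
    return cloud[d > radius], cloud[d <= radius]
```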
The stages through which the P element passes, in the case of comparing the point clouds with one of the 3D models


- 4th.- Performing a fusion of the information obtained by the alternative methods, in order to obtain the information of interest in a robust and reliable way and thus be able to feed the control laws of the boom and carry out the automatic refueling operation. To perform this task, each subsystem is charged with calculating certain values known as quality factors, which indicate, in essence, how reliable the results it has provided are, or what their probability of error is. This information is used to guarantee an optimal fusion of the results obtained.
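One plausible reading of this quality-factor fusion, sketched below, treats each subsystem's quality factor as a weight on its position estimate; the patent states only that quality factors guide the fusion, so the weighting rule is an assumption.

```python
import numpy as np

def fuse(estimates, qualities):
    """estimates: list of (3,) position vectors, e.g. from S3D, STOF, SDOE.
    qualities: per-subsystem reliability scores (higher = more trusted).
    Returns the quality-weighted average position."""
    w = np.asarray(qualities, dtype=float)
    w /= w.sum()                                   # normalize the weights
    return np.einsum("i,ij->j", w, np.stack(estimates))

p = fuse([np.array([10.2, 0.1, 30.5]),
          np.array([10.4, 0.0, 30.3]),
          np.array([10.3, 0.2, 30.6])],
         qualities=[0.9, 0.6, 0.7])
```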
The point clouds obtained by the subsystems S3D, SDOE and STOF are used in a hybrid calculation with the two procedures indicated; that is, the system jointly uses neural networks and the comparison with a 3D model to obtain the positions and vectors of interest.
Therefore, thanks to the system and method of this invention, a mechanism is provided for obtaining a data set as a function of time, with negligible latency and at an adequate rate, that allows the system governing the control laws of the tanker, of the boom itself and of the receiving aircraft to incorporate said data into its control, and thus steer the tanker, the boom and the receiver so as to bring about contact between the latter two semi-automatically or even automatically, supervised or not.
The nature of the present invention, as well as the manner of putting it into practice, having been sufficiently described, it is noted that, within its essential character, it may be implemented in other embodiments differing in detail from the one indicated by way of example, which will equally attain the protection sought, provided that its fundamental principle is not altered, changed or modified.
Claims (1)
[1]
Similar technologies:
Publication number | Publication date | Patent title
ES2584554A1|2016-09-28|System for detecting the tip of the pole and the mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling method|
KR20180005659A|2018-01-16|Distance sensor
CN107407553B|2020-07-28|Distance sensor
US10488192B2|2019-11-26|Distance sensor projecting parallel patterns
CN109425305A|2019-03-05|Use the depth measurement of multiple pulsed structured light projection instrument
JP2017532580A5|2018-11-29|
EP3295239B1|2021-06-30|Augmenting a depth map representation with a reflectivity map representation
CN109425306A|2019-03-05|Depth measurement component
US20200182974A1|2020-06-11|Vertical cavity surface emitting laser-based projector
CN109471523A|2019-03-15|Use the eye tracks of eyeball center
JP2016126144A5|2018-02-08|
CN108354585B|2020-10-02|Computer-implemented method for detecting corneal vertex
WO2017051057A1|2017-03-30|System for locating the position of the end of the boom, the mouth of the refuelling vessel and the tanker
ES2603430B2|2017-10-11|DETECTION SYSTEM AND METHOD FOR MAKING CONTACT BETWEEN THE TIP OF A FLYING BOOM AND THE MOUTH OF A RECEPTACLE FOR AERIAL REFUELLING OPERATIONS WITH A BOOM|
EA201401149A1|2016-05-31|METHOD OF INCREASING THE ACCURACY OF STARS ORIENTATION DETERMINATION AND LONGER MAINTAINING THE INCREASED ACCURACY OF ORIENTATION DETERMINATION AND DEVICE FOR THEIR IMPLEMENTATION
RU2018102751A|2019-07-30|PURKINJE METER AND AUTOMATIC EVALUATION METHOD
JP2016049258A|2016-04-11|Lighting and imaging device and visual axis detecting apparatus including the same
ES2543038B2|2015-11-26|Spatial location method and system using light markers for any environment
JP2010038847A|2010-02-18|Motion tracker system and coordinate system setting method therefor
AU2017252334B2|2022-03-17|Detection system and method for making contact between the tip of a flying boom and the mouth of a receptacle for aerial refuelling operations with a boom
ES2728787B2|2021-02-09|SYSTEM AND PROCEDURE TO CREATE, MODULATE AND DETECT SHADOWS IN SYSTEMS WITH CONTROL BASED ON A REMOTE VISUALIZATION SYSTEM
JP5262294B2|2013-08-14|Motion tracker device
Patent family:
Publication number | Publication date
AU2016363838A1|2018-04-19|
EP3385907A4|2019-09-04|
EP3385907A1|2018-10-10|
CN108290638A|2018-07-17|
SA518391408B1|2021-11-15|
ES2584554B2|2017-06-13|
WO2017093584A1|2017-06-08|
US20180350104A1|2018-12-06|
US10706583B2|2020-07-07|
EP3385907B1|2022-01-19|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US5530650A|1992-10-28|1996-06-25|Mcdonnell Douglas Corp.|Computer imaging system and method for remote in-flight aircraft refueling|
US5904729A|1997-03-04|1999-05-18|The Boeing Company|Automated director light system for aerial refueling operations|
US20030205643A1|2002-05-01|2003-11-06|Von Thal German|Boom load alleviation using visual means|
US6752357B2|2002-05-10|2004-06-22|The Boeing Company|Distance measuring using passive visual means|
US7469863B1|2005-03-24|2008-12-30|The Boeing Company|Systems and methods for automatically and semiautomatically controlling aircraft refueling|
US7309048B2|2005-07-29|2007-12-18|The Boeing Company|Vision system and method incorporating graphics symbology for use in a tanker refueling system|
EP2336027A1|2009-12-18|2011-06-22|EADS Construcciones Aeronauticas, S.A.|Method and system for enhanced vision in aerial refuelling operations|
EP2336028A1|2009-12-18|2011-06-22|EADS Construcciones Aeronauticas, S.A.|Improved method and system for enhanced vision in aerial refueling operations|
CN103557792B|2013-11-12|2015-10-28|中国科学院自动化研究所|Vision tracking and position measurement method for a drogue target|
ES2584231B2|2015-10-09|2017-06-02|Defensya Ingeniería Internacional, S.L.|LOCALIZATION SYSTEM FOR THE END OF THE BOOM, THE MOUTH OF THE REFUELLING RECEPTACLE AND THE TANKER|
CN109085845B|2018-07-31|2020-08-11|北京航空航天大学|Autonomous air refueling and docking bionic visual navigation control system and method|
ES2743489T3|2015-05-11|2020-02-19|Bae Systems Plc|Aircraft coupling method and system|
CN106710363A|2017-02-25|2017-05-24|佛山市三水区希望火炬教育科技有限公司|Tanker airplane model special for juvenile national defense science and technology studies|
CN110163914B|2018-08-01|2021-05-25|京东方科技集团股份有限公司|Vision-based positioning|
US11022972B2|2019-07-31|2021-06-01|Bell Textron Inc.|Navigation system with camera assist|
Legal status:
2017-06-13| FG2A| Definitive protection|Ref document number: 2584554 Country of ref document: ES Kind code of ref document: B2 Effective date: 20170613 |
Priority:
Application number | Filing date | Patent title
ES201531734A|2015-11-30|SYSTEM FOR DETECTING THE TIP OF THE POLE AND MOUTH OF THE RECEPTACLE, PROGRESSIVE AUTOMATION OF AERIAL REFUELLING WITH A BOOM, AND REFUELLING PROCEDURE|ES201531734A| ES2584554B2|2015-11-30|2015-11-30|SYSTEM FOR DETECTING THE TIP OF THE POLE AND MOUTH OF THE RECEPTACLE, PROGRESSIVE AUTOMATION OF AERIAL REFUELLING WITH A BOOM, AND REFUELLING PROCEDURE|
AU2016363838A| AU2016363838B2|2015-11-30|2016-11-28|System for detecting the tip of the pole and mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling method|
CN201680070182.7A| CN108290638A|2015-11-30|2016-11-28|System for detecting the tip of the pole and mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling method|
PCT/ES2016/070843| WO2017093584A1|2015-11-30|2016-11-28|System for detecting the tip of the pole and mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling method|
US15/778,017| US10706583B2|2015-11-30|2016-11-28|System and method for aerial refueling|
EP16870036.7A| EP3385907B1|2015-11-30|2016-11-28|System for detecting the tip of the pole and mouth of the receptacle, progressive automation of aerial refuelling with a boom, and refuelling method|
SA518391408A| SA518391408B1|2015-11-30|2018-04-21|System for detecting the tube tip and receptacle mouth, progressive automation of in-flight aerial refuelling with a boom, and refuelling method|